A Layer-by-Layer Learning Algorithm using Correlation Coefficient for Multilayer Perceptrons

Authors

Abstract


Similar Articles

Alternate Learning Algorithm on Multilayer Perceptrons

Multilayer perceptrons have been applied successfully to solve some difficult and diverse problems with the backpropagation learning algorithm. However, the algorithm is known to suffer slow and false convergence arising from flat regions and local minima of the cost function. Many algorithms announced so far to accelerate convergence and avoid local minima appear to pay some trade-off for ...
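The backpropagation update the abstract refers to can be sketched as follows. This is a minimal illustration of one gradient-descent step for a single-hidden-layer MLP with sigmoid activations and a squared-error cost; the shapes, the learning rate `lr`, and the function name `backprop_step` are illustrative assumptions, not details from the paper.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def backprop_step(x, t, W1, W2, lr=0.5):
    """One backpropagation step; returns updated weights and the pre-update error."""
    # forward pass
    h = sigmoid(W1 @ x)                    # hidden activations
    y = sigmoid(W2 @ h)                    # network output
    # backward pass: delta terms for the squared-error cost
    delta_out = (y - t) * y * (1 - y)
    delta_hid = (W2.T @ delta_out) * h * (1 - h)
    # plain gradient-descent updates (the slow convergence discussed above)
    W2 = W2 - lr * np.outer(delta_out, h)
    W1 = W1 - lr * np.outer(delta_hid, x)
    return W1, W2, 0.5 * np.sum((y - t) ** 2)
```

Iterating this step on a pattern lowers the error, but only at the first-order rate that motivates the accelerated methods surveyed in these papers.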


A Layer-by-Layer Levenberg-Marquardt algorithm for Feedforward Multilayer Perceptron

The error backpropagation (EBP) algorithm for training the feedforward multilayer perceptron (FMLP) has been used in many applications because it is simple and easy to implement. However, its gradient-descent method prevents the EBP algorithm from converging fast. To overcome the slow convergence of the EBP algorithm, second-order methods have been adopted. The Levenberg-Marquardt (LM) algorithm is estimated to...
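The Levenberg-Marquardt update mentioned above takes a damped Gauss-Newton step, w ← w − (JᵀJ + μI)⁻¹Jᵀe, where J is the Jacobian of the residuals e. A minimal sketch on a small nonlinear least-squares problem (fitting y = a·e^{bx}, an illustrative assumption rather than the paper's FMLP training problem):

```python
import numpy as np

def lm_step(w, x, y, mu):
    """One Levenberg-Marquardt step for fitting y = a * exp(b * x)."""
    a, b = w
    pred = a * np.exp(b * x)
    e = pred - y                                   # residual vector
    # Jacobian of the residuals with respect to (a, b)
    J = np.column_stack([np.exp(b * x), a * x * np.exp(b * x)])
    # damped Gauss-Newton step: mu -> 0 gives Gauss-Newton,
    # large mu approaches a small gradient-descent step
    step = np.linalg.solve(J.T @ J + mu * np.eye(2), J.T @ e)
    return w - step, 0.5 * e @ e
```

The damping factor μ is what lets LM interpolate between fast second-order convergence near a minimum and the robustness of gradient descent far from one.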


A Convergence Theorem for Sequential Learning in Two Layer Perceptrons

We consider a Perceptron with N_i input units, one output, and a yet unspecified number of hidden units. This Perceptron must be able to learn a given but arbitrary set of input-output examples. By sequential learning we mean that groups of patterns, pertaining to the same class, are sequentially separated from the rest by successively adding hidden units until the remaining patterns are all in t...


Training Multi-layer Perceptrons Using MiniMin Approach

Multi-layer perceptrons (MLPs) have been widely used in classification and regression tasks. How to improve the training speed of MLPs has been an active field of research. Instead of the classical method, we try to train MLPs by a MiniMin model which can ensure that the weights of the last layer are optimal at each step. Significant improvement on training speed has been made using our met...


A Linear Learning Method for Multilayer Perceptrons Using Least-Squares

Training multilayer neural networks is typically carried out using gradient descent techniques. Ever since the brilliant backpropagation (BP) algorithm, the first gradient-based algorithm proposed by Rumelhart et al., novel training algorithms have appeared to improve several facets of the learning process for feed-forward neural networks. Learning speed is one of these. In this paper, a learning a...
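The idea shared by this abstract and the MiniMin entry above, solving the last layer's weights exactly rather than by gradient descent, reduces to a linear least-squares problem once the hidden activations are fixed. A minimal sketch under that assumption; the random hidden layer and all shapes here are illustrative, not the papers' setups:

```python
import numpy as np

rng = np.random.default_rng(42)
X = rng.normal(size=(100, 4))            # inputs
T = X @ rng.normal(size=(4, 1))          # targets (linear in X, for illustration)
W1 = rng.normal(size=(4, 8))
H = np.tanh(X @ W1)                      # hidden activations, held fixed
# With H fixed, the output weights minimizing ||H @ W2 - T||^2 come from
# a single linear least-squares solve instead of iterative gradient steps.
W2, *_ = np.linalg.lstsq(H, T, rcond=None)
residual = np.linalg.norm(H @ W2 - T)
```

At the solution the normal equations hold, i.e. Hᵀ(HW₂ − T) = 0, which is the sense in which the last-layer weights are "optimal at each step."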



Journal

Journal title: Journal of the Korea Society of Computer and Information

Year: 2011

ISSN: 1598-849X

DOI: 10.9708/jksci.2011.16.8.039